43 research outputs found

    Simulation of Laser Propagation in a Plasma with a Frequency Wave Equation

    The aim of this work is to perform numerical simulations of the propagation of a laser in a plasma. At each time step, one has to solve a Helmholtz equation in a domain consisting of some hundreds of millions of cells. To solve this huge linear system, an iterative Krylov method is used with preconditioning by a separable matrix; the corresponding linear system is solved with a block cyclic reduction method. Some insights into the parallel implementation are also given. Lastly, numerical results are presented, including some features concerning the scalability of the numerical method on a parallel architecture.
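The approach in this abstract can be illustrated with a small sketch: a 1D variable-coefficient Helmholtz system solved by a Krylov method (GMRES) preconditioned with a constant-coefficient ("separable") version of the same operator. All sizes and coefficients below are invented for illustration, and a sparse LU factorization stands in for the block cyclic reduction solver the paper actually uses.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# Illustrative 1D Helmholtz problem: -u'' - k(x)^2 (1 + i*eta) u = f, Dirichlet BCs.
n = 200
h = 1.0 / (n + 1)
rng = np.random.default_rng(0)
k2 = (20.0 * np.pi) ** 2 * (1.0 + 0.2 * rng.random(n))   # variable k(x)^2
eta = 0.3                                                # absorption keeps A invertible

off = -np.ones(n - 1) / h**2
A = sp.diags([2.0 / h**2 - k2 * (1.0 + 1j * eta), off, off], [0, -1, 1], format="csc")

# "Separable" preconditioner: the same operator with k(x)^2 frozen at its mean,
# factorized directly here (the paper instead solves it by block cyclic reduction).
m_diag = (2.0 / h**2 - k2.mean() * (1.0 + 1j * eta)) * np.ones(n)
M = sp.diags([m_diag, off, off], [0, -1, 1], format="csc")
lu = spla.splu(M)
prec = spla.LinearOperator(A.shape, matvec=lu.solve, dtype=complex)

b = np.ones(n, dtype=complex)
x, info = spla.gmres(A, b, M=prec)
```

Because the preconditioned operator is a modest perturbation of the identity, GMRES converges in a handful of iterations; in the paper this solve is repeated at every time step of the frequency wave equation.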

    Variational methods

    This contribution presents derivative-based methods for local sensitivity analysis, called Variational Sensitivity Analysis (VSA). If one defines an output called the response function, its sensitivity to input variations around a nominal value can be studied using derivative (gradient) information. The main issue of VSA is then to provide an efficient way of computing gradients. This contribution first presents the theoretical grounds of VSA: framework and problem statement, tangent and adjoint methods. It then covers practical means of computing derivatives, from naive to more sophisticated approaches, discussing their respective merits. Finally, applications of VSA are reviewed and some examples are presented, covering various application fields: oceanography, glaciology, meteorology.
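The tangent/adjoint/naive distinction the abstract draws can be made concrete on a toy problem (not from the paper): a linear forward model u = G m with a quadratic response function J(m) = 0.5 ||G m − d||². All names and sizes are illustrative.

```python
import numpy as np

# Toy setting: linear forward model u = G m, response J(m) = 0.5 * ||G m - d||^2.
rng = np.random.default_rng(1)
n_in, n_out = 5, 40
G = rng.standard_normal((n_out, n_in))   # forward model operator (stand-in)
d = rng.standard_normal(n_out)           # reference data
m0 = rng.standard_normal(n_in)           # nominal input value

def response(m):
    r = G @ m - d
    return 0.5 * r @ r

# Tangent (forward) mode: one linearized run per input direction e_i gives dJ/dm_i.
r0 = G @ m0 - d
grad_tangent = np.array([r0 @ (G @ e) for e in np.eye(n_in)])

# Adjoint (reverse) mode: a single backward application of G^T gives the whole gradient.
grad_adjoint = G.T @ r0

# Naive approach: central finite differences, one pair of model runs per input.
eps = 1e-6
grad_fd = np.array([(response(m0 + eps * e) - response(m0 - eps * e)) / (2 * eps)
                    for e in np.eye(n_in)])
```

The adjoint produces all the sensitivities for roughly the cost of one extra model run, which is why it dominates when inputs are high-dimensional, as in the oceanography and meteorology applications the contribution reviews.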

    Investigating the role of prior and observation error correlations in improving a model forecast of forest carbon balance using Four Dimensional Variational data assimilation

    Efforts to implement variational data assimilation routines with functional ecology models and land surface models have been limited, with sequential and Markov chain Monte Carlo data assimilation methods being prevalent. When data assimilation has been used with models of carbon balance, prior or “background” errors (in the initial state and parameter values) and observation errors have largely been treated as independent and uncorrelated. Correlations between background errors have long been known to be a key aspect of data assimilation in numerical weather prediction. More recently, it has been shown that accounting for correlated observation errors in the assimilation algorithm can considerably improve data assimilation results and forecasts. In this paper we implement a Four-Dimensional Variational data assimilation (4D-Var) scheme with a simple model of forest carbon balance, for joint parameter and state estimation and assimilate daily observations of Net Ecosystem CO2 Exchange (NEE) taken at the Alice Holt forest CO2 flux site in Hampshire, UK. We then investigate the effect of specifying correlations between parameter and state variables in background error statistics and the effect of specifying correlations in time between observation errors. The idea of including these correlations in time is new and has not been previously explored in carbon balance model data assimilation. In data assimilation, background and observation error statistics are often described by the background error covariance matrix and the observation error covariance matrix. We outline novel methods for creating correlated versions of these matrices, using a set of previously postulated dynamical constraints to include correlations in the background error statistics and a Gaussian correlation function to include time correlations in the observation error statistics. 
The methods used in this paper will allow the inclusion of time correlations between many different observation types in the assimilation algorithm, meaning that previously neglected information can be accounted for. In our experiments we assimilate a single year of NEE observations and then run a forecast for the next 14 years. We compare the results using our new correlated background and observation error covariance matrices with those using diagonal covariance matrices. We find that using the new correlated matrices reduces the root mean square error in the 14-year forecast of daily NEE by 44%, decreasing from 4.22 gCm−2 day−1 to 2.38 gCm−2 day−1.
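The abstract says time correlations between observation errors are built with a Gaussian correlation function; one standard construction consistent with that description (the paper's exact parameters are not given here, so the values below are illustrative) is:

```python
import numpy as np

def gaussian_obs_covariance(times, sigma, length_scale):
    """Observation error covariance with Gaussian correlations in time:
    R_ij = sigma^2 * exp(-(t_i - t_j)^2 / (2 * L^2))."""
    t = np.asarray(times, dtype=float)
    corr = np.exp(-0.5 * ((t[:, None] - t[None, :]) / length_scale) ** 2)
    return sigma**2 * corr

# Daily NEE observation times over 10 days; error std and length scale are invented.
R = gaussian_obs_covariance(np.arange(10.0), sigma=0.5, length_scale=1.0)
R_diag = np.diag(np.full(10, 0.5**2))   # the uncorrelated (diagonal) alternative

# Either matrix enters the standard 4D-Var observation term
#   J_o(x) = 0.5 * (y - Hx)^T R^{-1} (y - Hx),
# so R must remain symmetric positive definite for the inverse to exist.
```

The Gaussian correlation function guarantees symmetry and positive definiteness for distinct observation times, which is what makes it a safe choice for introducing time correlations without breaking the assimilation algorithm.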

    An ensemble of eddy-permitting global ocean reanalyses from the MyOcean project

    A set of four eddy-permitting global ocean reanalyses produced in the framework of the MyOcean project have been compared over the altimetry period 1993–2011. The main differences among the reanalyses used here come from the data assimilation scheme implemented to control the ocean state by inserting reprocessed observations of sea surface temperature (SST), in situ temperature and salinity profiles, sea level anomaly and sea-ice concentration. A first objective of this work is to assess the interannual variability and trends for a series of parameters usually considered in the community as essential ocean variables: SST, sea surface salinity, temperature and salinity averaged over meaningful layers of the water column, sea level, transports across pre-defined sections, and sea ice parameters. The eddy-permitting nature of the global reanalyses also makes it possible to estimate eddy kinetic energy. The results show that in general there is good consistency between the different reanalyses. An intercomparison against experiments without data assimilation was done during the MyOcean project, and we conclude that data assimilation is crucial for correctly simulating some quantities such as regional trends of sea level as well as the eddy kinetic energy. A second objective is to show that the ensemble mean of reanalyses can be evaluated as one single system regarding its reliability in reproducing the climate signals, where both variability and uncertainties are assessed through the ensemble spread and signal-to-noise ratio. The main advantage of having access to several reanalyses differing in the way data assimilation is performed is that it becomes possible to assess part of the total uncertainty. Given the fact that we use very similar ocean models and atmospheric forcing, we can conclude that the spread of the ensemble of reanalyses is mainly representative of our ability to gauge uncertainty in the assimilation methods. 
This uncertainty varies considerably from one ocean parameter to another, especially in global indices. However, despite several caveats in the design of the multi-system ensemble, the main conclusion from this study is that an eddy-permitting multi-system ensemble approach has become mature, and our results provide a first step towards a systematic comparison of eddy-permitting global ocean reanalyses aimed at providing robust conclusions on the recent evolution of the oceanic state.
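The ensemble diagnostics described above (ensemble mean, spread, and signal-to-noise ratio) reduce to a few lines; the sketch below uses an entirely synthetic stand-in for four reanalyses of one global index over 1993–2011, with an invented trend and noise level.

```python
import numpy as np

# Synthetic stand-in for four reanalyses of a single global index over 1993-2011;
# the trend and noise magnitudes are invented for illustration only.
rng = np.random.default_rng(2)
years = np.arange(1993, 2012)
signal = 0.003 * (years - 1993)                  # common climate trend
members = signal + 0.002 * rng.standard_normal((4, years.size))

ens_mean = members.mean(axis=0)                  # the "single system" estimate
ens_spread = members.std(axis=0, ddof=1)         # spread ~ assimilation-method uncertainty
snr = np.abs(ens_mean) / ens_spread              # signal-to-noise ratio per year
```

Where the signal-to-noise ratio exceeds one, the reconstructed climate signal stands out from the inter-reanalysis uncertainty; where it does not, the spread dominates and conclusions about the trend are less robust.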

    The Role Of Condition-Specific Preference-Based Measures In Health Technology Assessment

    A condition-specific preference-based measure (CSPBM) is a measure of health-related quality of life (HRQoL) that is specific to a certain condition or disease and that can be used to obtain the quality adjustment weight of the quality adjusted life year (QALY) for use in economic models. This article provides an overview of the role and development of CSPBMs, and presents a description of existing CSPBMs in the literature. The article also reviews the psychometric properties of CSPBMs in comparison to generic preference-based measures (generic PBMs), and considers the advantages and disadvantages of CSPBMs relative to generic PBMs. CSPBMs typically include dimensions that are important for that condition but may not be important across all patient groups. There are a large number of CSPBMs across a wide range of conditions, and these vary from measures covering a wide range of dimensions to more symptomatic or uni-dimensional measures. Psychometric evidence is limited but suggests that CSPBMs offer an advantage in more accurate measurement of milder health states. The mean change and standard deviation can differ between CSPBMs and generic PBMs, and this may affect incremental cost-effectiveness ratios. CSPBMs have a useful role in HTA where a generic PBM is not appropriate, sensitive or responsive. However, due to issues of comparability across different patient groups and interventions, their usage in health technology assessment is often limited to conditions where it is inappropriate to use a generic PBM or to sensitivity analyses.
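The mechanics connecting a preference-based measure to an economic model are simple: the measure supplies a utility weight per health state, QALYs are utility-weighted years, and the incremental cost-effectiveness ratio divides incremental cost by incremental QALYs. All utilities, durations, and costs below are hypothetical.

```python
def qalys(utilities, durations_years):
    """QALYs: sum of quality-adjustment weight (utility) x years in each health state."""
    return sum(u * t for u, t in zip(utilities, durations_years))

# Hypothetical utilities for two health states under two interventions, e.g. as
# elicited with a condition-specific or a generic preference-based measure.
q_new = qalys([0.85, 0.70], [3.0, 2.0])   # new intervention: 3.95 QALYs
q_old = qalys([0.75, 0.60], [3.0, 2.0])   # comparator: 3.45 QALYs

# Incremental cost-effectiveness ratio = incremental cost / incremental QALYs.
cost_new, cost_old = 20_000.0, 12_000.0
icer = (cost_new - cost_old) / (q_new - q_old)   # 16,000 per QALY gained
```

Because the ICER denominator is a difference of utility-weighted totals, the abstract's point follows directly: if a CSPBM and a generic PBM assign different mean changes to the same health states, the resulting ICERs can differ even when costs are identical.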